Assessor Resource

ICTSAS514
Perform integration tests

Assessment tool

Version 1.0
Issue Date: May 2024


This unit describes the skills and knowledge required to ensure that the components of the system operate together to the expected standard.

It applies to senior development staff who are responsible for ensuring that sub-systems function correctly when combined.

No licensing, legislative, or certification requirements apply to this unit at the time of publication.

You may want to include more information here about the target group and the purpose of the assessments (eg formative, summative, recognition)



Evidence Required

List the assessment methods to be used and the context and resources required for assessment. Copy and paste the relevant sections from the evidence guide below and then re-write these in plain English.

ELEMENT

PERFORMANCE CRITERIA

Elements describe the essential outcomes.

Performance criteria describe the performance needed to demonstrate achievement of the element.

1. Prepare for test

1.1 Prepare test environment

1.2 Prepare test scripts (online test) or test run (batch test) for running

1.3 Review expected results against test and acceptance criteria

1.4 Confirm pre-existing modules and compile modification logs

1.5 Perform static tests of each point of integration and verify correctness of arguments, positional parameters and return values in each integration suite

1.6 Review results of earlier component testing and ensure critical issues are identified and considered

2. Conduct test

2.1 Select appropriate test tools

2.2 Run test scripts and document results against software life cycle model

2.3 Ensure that memory leakage, global name-space pollution and static variables are specifically addressed for each integration unit in line with test and acceptance criteria

2.4 Follow and adopt integration standards where appropriate in line with quality benchmarks

2.5 Compare test results to requirements on completion of each integration component

3. Analyse and classify results

3.1 Summarise and classify test results and highlight areas of concern

3.2 Compare test results against requirements and design specification, and prepare report

3.3 Notify operations of completion of testing where appropriate

3.4 Ensure attendees' details and comments are logged and signatures gained

3.5 Schedule and conduct a feedback meeting to discuss report and possible next actions with stakeholders if necessary

3.6 Ensure test reporting complies with documentation and reporting standards
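
Illustration for assessors (criterion 1.5): the Python sketch below shows one way a candidate might statically verify arguments, positional parameters and return values at a single point of integration before dynamic testing begins. It is a minimal sketch only; the module billing, the function charge and the agreed interface are hypothetical placeholders, not part of this unit or of any particular project.

"""Minimal sketch of a static check at one integration point (criterion 1.5).

Hypothetical example: a consumer module is being integrated with the billing
module, and the agreed interface is
charge(customer_id, amount, *, currency="AUD") -> str.
"""
import inspect

import billing  # hypothetical provider module under integration


def test_charge_signature_matches_agreed_interface():
    # Verify argument names, positional parameters and the return annotation
    # against the interface agreed in the requirements and design documents.
    sig = inspect.signature(billing.charge)
    params = list(sig.parameters.values())

    assert [p.name for p in params[:2]] == ["customer_id", "amount"]
    assert all(p.kind is p.POSITIONAL_OR_KEYWORD for p in params[:2])
    assert params[2].name == "currency" and params[2].kind is params[2].KEYWORD_ONLY
    assert sig.return_annotation is str

A static check of this kind can be run before the integration suite itself, so interface mismatches are identified before the dynamic tests are executed.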

Evidence of the ability to:

prepare the test environment

conduct tests using appropriate test tools and integration standards and quality benchmarks

perform integration requirements for the units

determine whether the units operate according to specifications

analyse and classify results

prepare reports that comply with documentation and reporting standards.

Note: Evidence must be provided on at least TWO systems or occasions.
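
Illustration for assessors: evidence for "analyse and classify results" and "prepare reports" might resemble the minimal Python sketch below, which classifies integration test outcomes against requirement identifiers and prints a short summary highlighting areas of concern. The result data and requirement identifiers are invented for the example; in practice the results would come from the test tools used for the current system or project.

"""Minimal sketch: summarise and classify integration test results.

The tuples below (test id, requirement id, outcome) are invented example data.
"""
from collections import Counter

results = [
    ("IT-01", "REQ-4.1", "pass"),
    ("IT-02", "REQ-4.2", "fail"),
    ("IT-03", "REQ-4.2", "pass"),
    ("IT-04", "REQ-5.0", "blocked"),
]

# Classify outcomes and highlight areas of concern (anything that did not pass).
summary = Counter(outcome for _, _, outcome in results)
concerns = [(test_id, req_id) for test_id, req_id, outcome in results if outcome != "pass"]

print("Integration test summary:", dict(summary))
print("Areas of concern (test id -> requirement):")
for test_id, req_id in concerns:
    print(f"  {test_id} -> {req_id}")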

To complete the unit requirements safely and effectively, the individual must:

describe the key features of at least two programming languages, with detailed knowledge of programming languages required by current system/project

compare and contrast automated test tools, with detailed knowledge of features and processes of tools used for current system/project

identify input and output requirements

discuss organisational practice, standards and benchmarks relating to integration testing

analyse and describe the system or application being tested

describe key features and processes of testing techniques and tools

analyse underlying test data.

Gather evidence to demonstrate consistent performance in conditions that are safe and replicate the workplace. Noise levels, production flow, interruptions and time variances must be typical of those experienced in the systems administration and support field of work, and include access to:

special purpose tools, equipment and materials

industry software packages

acceptance criteria

test plan

integration standards

requirements and design documents used in test analysis

system or application suitable for testing.

Assessors must satisfy NVR/AQTF assessor requirements.


Submission Requirements

List each assessment task's title, type (eg project, observation/demonstration, essay, assignment, checklist) and due date here

Assessment task 1: [title]      Due date:

(add new lines for each of the assessment tasks)


Assessment Tasks

Copy and paste from the following data to produce each assessment task. Write these in plain English and spell out how, when and where the task is to be carried out, under what conditions, and what resources are needed. Include guidelines about how well the candidate has to perform a task for it to be judged satisfactory.

ELEMENT

PERFORMANCE CRITERIA

Elements describe the essential outcomes.

Performance criteria describe the performance needed to demonstrate achievement of the element.

1. Prepare for test

1.1 Prepare test environment

1.2 Prepare test scripts (online test) or test run (batch test) for running

1.3 Review expected results against test and acceptance criteria

1.4 Confirm pre-existing modules and compile modification logs

1.5 Perform static tests of each point of integration and verify correctness of arguments, positional parameters and return values in each integration suite

1.6 Review results of earlier component testing and ensure critical issues are identified and considered

2. Conduct test

2.1 Select appropriate test tools

2.2 Run test scripts and document results against software life cycle model

2.3 Ensure that memory leakage, global name-space pollution and static variables are specifically addressed for each integration unit in line with test and acceptance criteria

2.4 Follow and adopt integration standards where appropriate in line with quality benchmarks

2.5 Compare test results to requirements on completion of each integration component

3. Analyse and classify results

3.1 Summarise and classify test results and highlight areas of concern

3.2 Compare test results against requirements and design specification, and prepare report

3.3 Notify operations of completion of testing where appropriate

3.4 Ensure attendees' details and comments are logged and signatures gained

3.5 Schedule and conduct a feedback meeting to discuss report and possible next actions with stakeholders if necessary

3.6 Ensure test reporting complies with documentation and reporting standards
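
Illustration for assessors (criterion 2.3): the Python sketch below shows one way a task could have candidates check an integration unit for global name-space pollution and memory leakage. The module report_engine, its generate() entry point and the 1 MB growth threshold are hypothetical placeholders; actual entry points and thresholds should come from the test and acceptance criteria for the system or project being assessed.

"""Minimal sketch: runtime checks for name-space pollution and memory growth
(criterion 2.3). report_engine and its generate() function are hypothetical.
"""
import builtins
import tracemalloc

import report_engine  # hypothetical integration unit under test


def test_no_builtin_namespace_pollution():
    # The integration unit must not add or overwrite names in the built-in namespace.
    before = set(vars(builtins))
    report_engine.generate(batch_size=100)
    assert set(vars(builtins)) == before


def test_memory_does_not_grow_across_repeated_calls():
    # Repeated calls should not keep accumulating traced memory (a leak indicator).
    tracemalloc.start()
    report_engine.generate(batch_size=100)   # warm-up call
    baseline, _ = tracemalloc.get_traced_memory()
    for _ in range(50):
        report_engine.generate(batch_size=100)
    current, _ = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    assert current - baseline < 1_000_000    # threshold from acceptance criteria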

Evidence of the ability to:

prepare the test environment

conduct tests using appropriate test tools and integration standards and quality benchmarks

perform integration requirements for the units

determine whether the units operate according to specifications

analyse and classify results

prepare reports that comply with documentation and reporting standards.

Note: Evidence must be provided on at least TWO systems or occasions.

To complete the unit requirements safely and effectively, the individual must:

describe the key features of at least two programming languages, with detailed knowledge of programming languages required by current system/project

compare and contrast automated test tools, with detailed knowledge of features and processes of tools used for current system/project

identify input and output requirements

discuss organisational practice, standards and benchmarks relating to integration testing

analyse and describe the system or application being tested

describe key features and processes of testing techniques and tools

analyse underlying test data.

Gather evidence to demonstrate consistent performance in conditions that are safe and replicate the workplace. Noise levels, production flow, interruptions and time variances must be typical of those experienced in the systems administration and support field of work, and include access to:

special purpose tools, equipment and materials

industry software packages

acceptance criteria

test plan

integration standards

requirements and design documents used in test analysis

system or application suitable for testing.

Assessors must satisfy NVR/AQTF assessor requirements.

Copy and paste from the following performance criteria to create an observation checklist for each task. When you have finished writing your assessment tool, every one of these must have been addressed, preferably several times in a variety of contexts. To ensure this occurs, download the assessment matrix for the unit, enter each assessment task as a column header and place a check mark against each performance criterion that task addresses.

Observation Checklist

Tasks to be observed according to workplace/college/TAFE policy and procedures, relevant legislation and Codes of Practice | Yes | No | Comments/feedback
Prepare test environment 
Prepare test scripts (online test) or test run (batch test) for running 
Review expected results against test and acceptance criteria 
Confirm pre-existing modules and compile modification logs 
Perform static tests of each point of integration and verify correctness of arguments, positional parameters and return values in each integration suite 
Review results of earlier component testing and ensure critical issues are identified and considered 
Select appropriate test tools 
Run test scripts and document results against software life cycle model 
Ensure that memory leakage, global name-space pollution and static variables are specifically addressed for each integration unit in line with test and acceptance criteria 
Follow and adopt integration standards where appropriate in line with quality benchmarks 
Compare test results to requirements on completion of each integration component 
Summarise and classify test results and highlight areas of concern 
Compare test results against requirements and design specification, and prepare report 
Notify operations of completion of testing where appropriate 
Ensure attendees' details and comments are logged and signatures gained 
Schedule and conduct a feedback meeting to discuss report and possible next actions with stakeholders if necessary 
Ensure test reporting complies with documentation and reporting standards 

Forms

Assessment Cover Sheet

ICTSAS514 - Perform integration tests
Assessment task 1: [title]

Student name:

Student ID:

I declare that the assessment tasks submitted for this unit are my own work.

Student signature:

Result: Competent / Not yet competent

Feedback to student

 

 

 

 

 

 

 

 

Assessor name:

Signature:

Date:


Assessment Record Sheet

ICTSAS514 - Perform integration tests

Student name:

Student ID:

Assessment task 1: [title]    Result: Competent / Not yet competent

(add lines for each task)

Feedback to student:

 

 

 

 

 

 

 

 

Overall assessment result: Competent / Not yet competent

Assessor name:

Signature:

Date:

Student signature:

Date: